
    A large study on the effect of code obfuscation on the quality of java code

    Context: Obfuscation is a common technique used to protect software against malicious reverse engineering. Obfuscators manipulate the source code to make it harder to analyze and more difficult for an attacker to understand. Although different obfuscation algorithms and implementations are available, they have never been directly compared in a large-scale study. Aim: This paper evaluates and quantifies the effect of several different obfuscation implementations (both open source and commercial), to help developers and project managers decide which algorithms to use. Method: We applied 44 obfuscations to 18 subject applications covering a total of 4 million lines of code. The effectiveness of these source code obfuscations was measured using 10 code metrics covering the modularity, size and complexity of the code. Results: Some of the considered obfuscations are effective in making code metrics change substantially from original to obfuscated code, although this change (called the potency of the obfuscation) differs across metrics. We conclude by recommending which obfuscations to select, given the security requirements of the software to be protected.
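    The potency measure named in the results is, in the usual Collberg-style definition, the relative change an obfuscation induces in a metric. A minimal sketch of that computation, assuming that definition; the metric names and values below are illustrative, not the study's data:

```python
def potency(metric_original: float, metric_obfuscated: float) -> float:
    """Potency of an obfuscation with respect to one metric:
    the relative change from original to obfuscated code.
    Positive values mean the metric increased after obfuscation."""
    return metric_obfuscated / metric_original - 1.0

# Illustrative values only (not taken from the paper).
metrics = {
    "cyclomatic_complexity": (1200.0, 3100.0),
    "lines_of_code":         (45000.0, 52000.0),
    "coupling":              (310.0, 295.0),
}

for name, (orig, obf) in metrics.items():
    print(f"{name}: potency = {potency(orig, obf):+.2f}")
```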

    An empirical analysis of source code metrics and smart contract resource consumption

    A smart contract (SC) is a programme stored in the Ethereum blockchain by a contract-creation transaction. SC developers deploy an instance of the SC and attempt to execute it in exchange for a fee, paid in Ethereum coins (Ether). If the computation needed for the execution turns out to be larger than the effort proposed by the developer (i.e., the gasLimit), the client instantiation will not complete successfully. In this paper, we examine SCs from 11 Ethereum blockchain-oriented software projects hosted on GitHub.com, and we evaluate the resources needed for their deployment (i.e., the gasUsed). For each of these contracts, we also extract a suite of object-oriented (OO) metrics to evaluate their structural characteristics. Our results show a statistically significant correlation between some of the OO metrics and the resources consumed on the Ethereum blockchain network when deploying SCs. This result has a direct impact on how Ethereum developers engage with an SC: by evaluating its structural characteristics, they can produce a better estimate of the resources needed to deploy it. Further results show which source code metrics should be prioritised for particular application domains when the projects are clustered around common themes.
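    The headline result is a statistically significant correlation between OO metrics and deployment gas. A minimal sketch of such an analysis, assuming a rank correlation test (the abstract does not name the test) and hypothetical column names:

```python
# Hypothetical analysis sketch: correlate OO metrics with deployment gas.
# The column names and the choice of Spearman's rho are assumptions,
# not the paper's exact setup.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("contracts.csv")  # hypothetical file: one row per deployed contract

for metric in ["wmc", "dit", "cbo", "loc"]:  # hypothetical OO metric columns
    rho, p = spearmanr(df[metric], df["gas_used"])
    flag = "significant" if p < 0.05 else "not significant"
    print(f"{metric} vs gasUsed: rho={rho:.2f}, p={p:.3g} ({flag})")
```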

    A framework for the simulation of structural software evolution

    As functionality is added to an aging piece of software, its original design and structure will tend to erode. This can lead to high coupling, low cohesion and other undesirable effects associated with spaghetti architectures. The underlying forces that cause such degradation have been the subject of much research. However, progress in this field is slow, as its complexity makes it difficult to isolate the causal flows leading to these effects. This is further complicated by the difficulty of generating empirical data in sufficient quantity and attributing such data to specific points in the causal chain. This article describes a framework for simulating the structural evolution of software. A complete simulation model is built by incrementally adding modules to the framework, each of which contributes an individual evolutionary effect. These effects are then combined to form a multifaceted simulation that evolves a fictitious code base in a manner approximating real-world behavior. We describe the underlying principles and structures of our framework from a theoretical and user perspective; a validation of a simple set of evolutionary parameters is then provided, and three empirical software studies generated from open-source software (OSS) are used to support the claims and generated results. The research illustrates how simulation can be used to investigate a complex and under-researched area of the development cycle. It also shows the value of incorporating certain human traits into a simulation: factors that, in real-world system development, can significantly influence evolutionary structures.
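    The framework's core idea is composing independent evolutionary effects into one simulation. A minimal sketch of that plug-in structure, assuming a simple per-step module interface; the class names, effects and probabilities are illustrative, not the framework's actual API:

```python
# Illustrative sketch of a modular evolution simulator: each module
# contributes one evolutionary effect, and the simulator applies them
# in turn to a fictitious code base.
import random

class CodeBase:
    def __init__(self):
        self.modules = 1       # number of source modules
        self.couplings = 0     # inter-module dependencies

class AddFunctionality:
    """Growth effect: new code occasionally lands in a new module."""
    def apply(self, cb: CodeBase):
        if random.random() < 0.3:
            cb.modules += 1

class ErodeStructure:
    """Aging effect: shortcuts add coupling between existing modules."""
    def apply(self, cb: CodeBase):
        cb.couplings += random.randint(0, cb.modules // 2)

def simulate(effects, steps=100):
    cb = CodeBase()
    for _ in range(steps):
        for effect in effects:  # effects combine into one multifaceted run
            effect.apply(cb)
    return cb

cb = simulate([AddFunctionality(), ErodeStructure()])
print(f"modules={cb.modules}, couplings={cb.couplings}")
```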

    Parametrizations of Inclusive Cross Sections for Pion Production in Proton-Proton Collisions

    Accurate knowledge of cross sections for pion production in proton-proton collisions finds wide application in particle physics, astrophysics, cosmic-ray physics and space radiation problems, especially in situations where an incident proton is transported through some medium and one requires knowledge of the output particle spectrum given the input spectrum. In such cases, accurate parametrizations of the cross sections are desired. In this paper we review much of the experimental data and compare them to a wide variety of different cross section parametrizations. In so doing, we provide parametrizations of neutral and charged pion cross sections which give a very accurate description of the experimental data. Lorentz-invariant differential cross sections, spectral distributions and total cross section parametrizations are presented. Comment: 32 pages with 15 figures. Published in Physical Review D62, 094030. The file includes 6 tex files; the main file is paper.tex, which has include statements referring to the rest. Figures are in graphs.di
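    For reference, the Lorentz-invariant differential cross section the abstract refers to is the standard invariant single-particle inclusive quantity. A sketch of the conventional definition (standard notation, not a formula quoted from the paper):

```latex
% Standard Lorentz-invariant inclusive cross section for p + p -> pi + X,
% in terms of the pion energy E, longitudinal momentum p_parallel and
% transverse momentum p_T (after azimuthal integration):
E \frac{d^3\sigma}{dp^3}
  = \frac{E}{\pi}\,\frac{d^2\sigma}{dp_{\parallel}\,dp_T^2}
```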

    HEP Applications Evaluation of the EDG Testbed and Middleware

    Workpackage 8 of the European DataGrid project was formed in January 2001 with representatives from the four LHC experiments, and with experiment-independent people from five of the six main EDG partners. In September 2002 WP8 was strengthened by the addition of effort from BaBar and D0. The original mandate of WP8 was, following the definition of short- and long-term requirements, to port experiment software to the EDG middleware and testbed environment. A major additional activity has been testing the basic functionality and performance of this environment. This paper reviews experiences and evaluations in the areas of job submission, data management, mass storage handling, information systems and monitoring. It also comments on the problems of remote debugging, the portability of code, and scaling problems with increasing numbers of jobs, sites and nodes. Reference is made to the pioneering work of ATLAS and CMS in integrating the use of the EDG Testbed into their data challenges. A forward look is made to essential software developments within EDG and to the necessary cooperation between EDG and LCG for the LCG prototype due in mid-2003. Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics Conference (CHEP03), La Jolla, CA, USA, March 2003, 7 pages. PSN THCT00

    Fluctuations in Hadronic and Nuclear Collisions

    We investigate several fluctuation effects in high-energy hadronic and nuclear collisions through the analysis of different observables. To introduce fluctuations in the initial stage of collisions, we use the Interacting Gluon Model (IGM) modified by the inclusion of the impact parameter. The inelasticity and leading-particle distributions follow directly from this model. The fluctuation effects on rapidity distributions are then studied by using Landau's Hydrodynamic Model in one dimension. To investigate further the effects of the multiplicity fluctuation, we use the Longitudinal Phase-Space Model, with the multiplicity distribution calculated within the hydrodynamic model and the initial conditions given by the IGM. Forward-backward correlation is obtained in this way. Comment: 22 pages, RevTex, 8 figures (included); invited paper for the special issue of Foundations of Physics dedicated to Mikio Namiki's 70th birthday
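    The forward-backward correlation mentioned at the end is conventionally quantified by a linear relation between the mean backward-hemisphere multiplicity and the observed forward-hemisphere multiplicity. A sketch of that standard parametrization (an assumption about the usual convention, not a formula quoted from the paper):

```latex
% Standard linear parametrization of forward-backward multiplicity
% correlations: mean backward multiplicity conditioned on the
% forward multiplicity n_F.
\langle n_B \rangle (n_F) = a + b\, n_F
% b is the correlation strength; b = 0 means no correlation.
```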

    Distributed Computing Grid Experiences in CMS

    The CMS experiment is currently developing a computing system capable of serving, processing and archiving the large number of events that will be generated when the CMS detector starts taking data. During 2004 CMS undertook a large-scale data challenge to demonstrate the ability of the CMS computing system to cope with a sustained data-taking rate equivalent to 25% of the startup rate. Its goals were: to run CMS event reconstruction at CERN for a sustained period at a 25 Hz input rate; to distribute the data to several regional centers; and to enable data access at those centers for analysis. Grid middleware was utilized to help complete all aspects of the challenge. To continue to provide scalable access from anywhere in the world to the data, CMS is developing a layer of software that uses Grid tools to gain access to data and resources, and that aims to provide physicists with a user-friendly interface for submitting their analysis jobs. This paper describes the data challenge experience with Grid infrastructure and the current development of the CMS analysis system.
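    To give a sense of scale for the sustained 25 Hz goal, a quick back-of-the-envelope calculation (the per-day figure is derived here from the quoted rate, not stated in the paper):

```python
# Scale of the 2004 CMS data challenge target:
# sustained event reconstruction at a 25 Hz input rate.
rate_hz = 25                    # events per second (from the abstract)
seconds_per_day = 24 * 3600
events_per_day = rate_hz * seconds_per_day
print(f"{events_per_day:,} events/day")   # 2,160,000 events/day
```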

    Some Findings Concerning Requirements in Agile Methodologies

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases within the frame of conventional methods. However, conclusions drawn from conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in Agile approaches, and a better understanding of this area is needed. This paper describes some findings concerning requirements activities in a project developed under an Agile methodology. The project was intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and the management of requirements dependencies.

    Leading particle effect, inelasticity and the connection between average multiplicities in $e^+e^-$ and $pp$ processes

    The Regge-Mueller formalism is used to describe the inclusive spectrum of the proton in $pp$ collisions. From such a description the energy dependences of both average inelasticity and leading proton multiplicity are calculated. These quantities are then used to establish the connection between the average charged particle multiplicities measured in $e^+e^-$ and $pp/\bar{p}p$ processes. The description obtained for the leading proton cross section implies that Feynman scaling is strongly violated only at the extreme values of $x_F$, that is at the central region ($x_F \approx 0$) and at the diffraction region ($x_F \approx 1$), while it is approximately observed in the intermediate region of the spectrum. Comment: 20 pages, 10 figures, to be published in Physical Review
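    Feynman scaling, whose violation the abstract discusses, is the statement that at high energies the invariant inclusive cross section depends on $x_F$ and $p_T$ alone, not separately on the collision energy. A sketch of the standard definition (conventional notation, not quoted from the paper):

```latex
% Feynman scaling: asymptotically, the invariant cross section becomes
% a function of x_F and p_T only, independent of s.
E \frac{d^3\sigma}{dp^3} \;\longrightarrow\; f(x_F, p_T)
\quad (s \to \infty),
\qquad
x_F = \frac{2 p^{*}_{\parallel}}{\sqrt{s}}
% p*_parallel: longitudinal momentum in the centre-of-mass frame.
```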
